PolyLU: A simple and robust polynomial-based linear unit activation function for deep learning

Authors

Abstract

The activation function has a critical influence on whether a convolutional neural network in deep learning can converge or not; a proper activation function not only makes the network converge faster but also reduces the complexity of the architecture while achieving the same or better performance. Many activation functions have been proposed; however, they differ in their advantages, defects, and applicable architectures. A new activation function called the Polynomial Linear Unit (PolyLU) is proposed in this paper to improve on some shortcomings of the existing functions. PolyLU meets the following basic properties: continuously differentiable, approximately the identity near the origin, unbounded for positive inputs, bounded for negative inputs, smooth, monotonic, and zero-centered. It uses a polynomial term for negative inputs and no exponential terms, which reduces the computational cost of the network. Compared with common functions such as Sigmoid, Tanh, ReLU, LeakyReLU, ELU, Mish, and Swish, experiments show improved accuracy over the MNIST, Kaggle Cats and Dogs, CIFAR-10, and CIFAR-100 datasets. Tested with batch normalization, PolyLU improves accuracy by 0.62%, 2.82%, 2.44%, 1.33%, 2.08%, and 4.26% over the compared functions (including LeakyReLU and Tanh), respectively; without batch normalization, the improvements are 1.24%, 4.39%, 2.12%, 5.43%, 15.51%, and 8.10%.
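The exact PolyLU formula appears in the full text. Purely as an illustration, a piecewise activation satisfying every property listed in the abstract (identity on the positive side, a bounded polynomial-ratio branch on the negative side, no exponentials) can be sketched as below; the specific negative branch 1/(1 - x) - 1 is an assumption for this sketch, not necessarily the paper's definition:

```python
def poly_lu(x: float) -> float:
    """Illustrative PolyLU-style activation (assumed form, not the paper's).

    - x >= 0: exact identity, so the positive side is unbounded.
    - x < 0: 1/(1 - x) - 1, a rational (polynomial-ratio) branch that is
      smooth, monotonic, approximately the identity near the origin
      (its derivative at 0 is 1), and bounded below by -1.
    No exponential terms are used, keeping evaluation cheap.
    """
    if x >= 0.0:
        return x
    return 1.0 / (1.0 - x) - 1.0
```

Because both branches meet at 0 with matching value and slope, the sketch is continuously differentiable there, which is the key property distinguishing it from ReLU's kink at the origin.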


Similar sources

Deep learning-based CAD systems for mammography: A review article

Breast cancer is one of the most common types of cancer in women. Screening mammography is a low‑dose X‑ray examination of breasts, which is conducted to detect breast cancer at early stages when the cancerous tumor is too small to be felt as a lump. Screening mammography is conducted for women with no symptoms of breast cancer, for early detection of cancer when the cancer is most treatable an...


A polynomial-time algorithm for linear optimization based on a new simple kernel function

We present a new barrier function, based on a kernel function with a linear growth term and an inverse linear barrier term. Existing kernel functions have a quadratic (or higher degree) growth term, and a barrier term that is either transcendent (e.g. logarithmic) or of a more complicated algebraic form. So the new kernel function has the simplest possible form compared with all existing kernel...


A new type-II fuzzy logic based controller for non-linear dynamical systems with application to a 3-PSP parallel robot

Type-II fuzzy logic has shown its superiority over traditional fuzzy logic when dealing with uncertainty. Type-II fuzzy logic controllers are, however, newer and more promising approaches that have recently been applied to various fields due to their significant contribution, especially when noise (an important instance of uncertainty) emerges. During the design of type-I fuz...


A Very Simple Polynomial-Time Algorithm for Linear Programming

In this note we propose a polynomial-time algorithm for linear programming. This algorithm augments the objective by a logarithmic penalty function and then solves a sequence of quadratic approximations of this program. This algorithm has a complexity of O(m^(1/2) L) iterations and O(m^(3.5) L) arithmetic operations, where m is the number of variables and L is the size of the problem encoding in b...


Linear relaxations of polynomial positivity for polynomial Lyapunov function synthesis

We examine linear programming (LP) based relaxations for synthesizing polynomial Lyapunov functions to prove the stability of polynomial ODEs. Our approach starts from a desired parametric polynomial form of the polynomial Lyapunov function. Subsequently, we encode the positive-definiteness of the function, and the negation of its derivative, over the domain of interest. We first compare two cl...



Journal

Journal title: IEEE Access

Year: 2023

ISSN: 2169-3536

DOI: https://doi.org/10.1109/access.2023.3315308